8 research outputs found

    Evolutionary connectionism: algorithmic principles underlying the evolution of biological organisation in evo-devo, evo-eco and evolutionary transitions

    The mechanisms of variation, selection and inheritance, on which evolution by natural selection depends, are not fixed over evolutionary time. Current evolutionary biology is increasingly focussed on understanding how the evolution of developmental organisations modifies the distribution of phenotypic variation, the evolution of ecological relationships modifies the selective environment, and the evolution of reproductive relationships modifies the heritability of the evolutionary unit. The major transitions in evolution, in particular, involve radical changes in developmental, ecological and reproductive organisations that instantiate variation, selection and inheritance at a higher level of biological organisation. However, current evolutionary theory is poorly equipped to describe how these organisations change over evolutionary time and especially how that results in adaptive complexes at successive scales of organisation (the key problem is that evolution is self-referential, i.e. the products of evolution change the parameters of the evolutionary process). Here we first reinterpret the central open questions in these domains from a perspective that emphasises the common underlying themes. We then synthesise the findings from a developing body of work that is building a new theoretical approach to these questions by converting well-understood theory and results from models of cognitive learning. Specifically, connectionist models of memory and learning demonstrate how simple incremental mechanisms, adjusting the relationships between individually-simple components, can produce organisations that exhibit complex system-level behaviours and improve the adaptive capabilities of the system. 
We use the term “evolutionary connectionism” to recognise that, by functionally equivalent processes, natural selection acting on the relationships within and between evolutionary entities can result in organisations that produce complex system-level behaviours in evolutionary systems and modify the adaptive capabilities of natural selection over time. We review the evidence supporting the functional equivalences between the domains of learning and of evolution, and discuss the potential for this to resolve conceptual problems in our understanding of the evolution of developmental, ecological and reproductive organisations and, in particular, the major evolutionary transitions.

    The evolution of evolvability: how evolution learns to evolve

    It is hypothesised that one of the main reasons evolution has produced such a tremendous diversity of amazing designs is that evolution has improved its own ability to innovate, a process called the ‘evolution of evolvability’. Rupert Riedl, an early pioneer of evolutionary developmental biology, suggested that evolvability is facilitated by a specific developmental organisation that is itself a product of past selection. However, the construction of a theoretical framework to formalise such ‘evolution of evolvability’ has been continually frustrated by the indisputable fact that natural selection cannot favour structures for benefits they have not yet produced. Here we resolve this seeming paradox. Recent work shows that short-term selective pressures on gene interactions are functionally equivalent to a simple type of associative learning, well-understood in neural network research. This is important for the evolution of evolvability because this type of learning system can clearly change in a way that improves its performance on unseen, future test cases, without the need for the future to cause the past. Recognising a formal link with the conditions that enable such predictive generalisation in machine learning systems unlocks well-established theory that can be applied to understanding the evolution of evolvability. Here we use this to elucidate, and demonstrate for the first time, conditions where short-term selective pressures alter evolutionary trajectories in a manner that systematically improves long-term evolutionary outcomes.
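The associative-learning equivalence mentioned above can be illustrated with a minimal Hebbian sketch (illustrative only: the pattern values and network size are arbitrary, not taken from the paper). Storing correlations between component states lets a corrupted configuration settle back toward a previously ‘selected’ one, which is the sense in which short-term adjustment of interactions can improve behaviour on cases not yet seen:

```python
import numpy as np

def hebbian_store(patterns):
    """Store +/-1 patterns in a weight matrix via the Hebbian rule."""
    n = len(patterns[0])
    w = np.zeros((n, n))
    for p in patterns:
        p = np.asarray(p)
        w += np.outer(p, p)       # strengthen co-occurring components
    np.fill_diagonal(w, 0)        # no self-interactions
    return w / len(patterns)

def recall(w, cue, steps=10):
    """Iteratively settle a (possibly corrupted) cue toward a stored pattern."""
    s = np.asarray(cue, dtype=float)
    for _ in range(steps):
        s = np.sign(w @ s)
        s[s == 0] = 1.0
    return s.astype(int)

# Two stored 'phenotypic' patterns; a corrupted cue recovers the nearest one.
pats = [[1, 1, 1, -1, -1, -1], [1, -1, 1, -1, 1, -1]]
W = hebbian_store(pats)
print(recall(W, [1, 1, -1, -1, -1, -1]))  # settles to the first pattern
```

The cue differs from the first stored pattern in one position, so the learned interactions pull it back to that attractor, analogous to a developmental organisation canalising a previously selected phenotype.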

    How evolution learns to generalise: using the principles of learning theory to understand the evolution of developmental organisation

    One of the most intriguing questions in evolution is how organisms exhibit suitable phenotypic variation to rapidly adapt in novel selective environments. Such variability is crucial for evolvability, but poorly understood. In particular, how can natural selection favour developmental organisations that facilitate adaptive evolution in previously unseen environments? Such a capacity suggests foresight that is incompatible with the short-sighted concept of natural selection. A potential resolution is provided by the idea that evolution may discover and exploit information not only about the particular phenotypes selected in the past, but their underlying structural regularities: new phenotypes, with the same underlying regularities, but novel particulars, may then be useful in new environments. If true, we still need to understand the conditions in which natural selection will discover such deep regularities rather than exploiting ‘quick fixes’ (i.e., fixes that provide adaptive phenotypes in the short term, but limit future evolvability). Here we argue that the ability of evolution to discover such regularities is formally analogous to learning principles, familiar in humans and machines, that enable generalisation from past experience. Conversely, natural selection that fails to enhance evolvability is directly analogous to the learning problem of over-fitting and the subsequent failure to generalise. We support the conclusion that evolving systems and learning systems are different instantiations of the same algorithmic principles by showing that existing results from the learning domain can be transferred to the evolution domain. Specifically, we show that conditions that alleviate over-fitting in learning systems successfully predict which biological conditions (e.g., environmental variation, regularity, noise or a pressure for developmental simplicity) enhance evolvability. 
This equivalence provides access to a well-developed theoretical framework from learning theory that enables a characterisation of the general conditions for the evolution of evolvability.
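The over-fitting analogy maps directly onto a standard regression example (a sketch for intuition only, not the paper's gene-network model). A flexible model fitted to a few noisy samples memorises them, a ‘quick fix’, while a parsimony pressure (here an L2 weight penalty, the analogue of favouring weak connectivity) trades a little training fit for smaller weights and typically better generalisation:

```python
import numpy as np

rng = np.random.default_rng(0)

def ridge_fit(X, y, lam):
    """Least squares with an L2 weight penalty of strength lam
    (the analogue of a selective pressure for weak connectivity)."""
    n = X.shape[1]
    A = np.vstack([X, np.sqrt(lam) * np.eye(n)])
    b = np.concatenate([y, np.zeros(n)])
    return np.linalg.lstsq(A, b, rcond=None)[0]

# A smooth underlying regularity observed through noise at a few points.
true_f = lambda x: np.sin(2 * np.pi * x)
x_train = rng.uniform(0, 1, 12)
y_train = true_f(x_train) + rng.normal(0, 0.2, size=12)

X_train = np.vander(x_train, 10)           # degree-9 polynomial features
w_fit = ridge_fit(X_train, y_train, 0.0)   # memorises the noisy sample
w_reg = ridge_fit(X_train, y_train, 1e-2)  # penalised: smaller weights

train_err = lambda w: np.mean((X_train @ w - y_train) ** 2)
x_test = np.linspace(0, 1, 200)
gen_err = lambda w: np.mean((np.vander(x_test, 10) @ w - true_f(x_test)) ** 2)
print(f"train: {train_err(w_fit):.4f} vs {train_err(w_reg):.4f}; "
      f"generalisation: {gen_err(w_fit):.4f} vs {gen_err(w_reg):.4f}")
```

The penalised fit is guaranteed to have training error at least as high, and weight norm at least as low, as the unpenalised fit; the interesting, typical outcome is that its error on the unseen grid is lower, the regression counterpart of enhanced evolvability.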

    Pictorial representation of phenotypes.

    <p>(Top) Schematic representation of mapping from phenotypic pattern sequences onto pictorial features. Each phenotypic ‘slot’ represents a set of features (here 4) controlling a certain aspect of the phenotype (e.g., front wings, halteres and antennae). Within the possible configurations in each slot (here 16), there are two particular configurations (state A and B) that are fit in some environment or another (see Developmental Model in <a href="http://www.ploscompbiol.org/article/info:doi/10.1371/journal.pcbi.1005358#pcbi.1005358.s001" target="_blank">S1 Appendix</a>). For example, ‘+ + −−’ in the second slot (from the top, green) of the phenotypic pattern encodes for a pair of front wings (state B), while ‘− − ++’ encodes for their absence (state A). States A and B are the complement of one another, i.e., not neighbours in phenotype space. All of the other intermediate states (here 14) are represented by a random mosaic image of state A and B, based on their respective distance. <i>d</i><sub><i>A</i></sub> indicates the Hamming distance between a given state and state A. Accordingly, there exist potential intermediate states (i.e., 4 for <i>d</i><sub><i>A</i></sub> = 1, 6 for <i>d</i><sub><i>A</i></sub> = 2 and 4 for <i>d</i><sub><i>A</i></sub> = 3). (Bottom) Pictorial representation of all phenotypes that are perfectly adapted to each of eight different environments. Each target phenotype is analogous to an insect-like organism comprised of 4 functional features. The grey phenotypic targets correspond to bit-wise complementary patterns of the phenotypes on the top half of the space. For example, in the rightmost, top insect, the antennae, forewings, and hindwings are present, and the tail is not. In the rightmost, bottom insect (the bitwise complement of the insect above it), the antennae, forewings, and hindwings are absent, but the tail is present. 
We define the top row as ‘the class’ and we disregard the bottom complements as degenerate forms of generalisation.</p>
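The slot combinatorics in this caption can be checked directly (a toy reproduction with hypothetical bit encodings, not the paper's code): states A and B are bitwise complements at Hamming distance 4, and the binomial counts give the 4, 6 and 4 intermediate configurations quoted above:

```python
from itertools import product
from collections import Counter

def hamming(a, b):
    """Number of positions at which two feature strings differ."""
    return sum(x != y for x, y in zip(a, b))

# One 4-feature slot: state B ('+ + - -') and its bitwise complement A.
state_B = (1, 1, 0, 0)
state_A = (0, 0, 1, 1)
assert hamming(state_A, state_B) == 4   # not neighbours in phenotype space

# Count configurations of the slot at each distance d_A from state A.
counts = Counter(hamming(cfg, state_A) for cfg in product((0, 1), repeat=4))
print(sorted(counts.items()))  # [(0, 1), (1, 4), (2, 6), (3, 4), (4, 1)]
```

The 16 configurations split into state A, state B and the 4 + 6 + 4 = 14 intermediates rendered as mosaic images in the figure.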

    How generalisation changes over evolutionary time.

    <p>The match between phenotypic distributions generated by the evolved GRNs and the target phenotypes of the selective environments the developmental system has been exposed to (training error) and of all selective environments (test error), against evolutionary time, for (A) moderate environmental switching, (B) noisy environments, (C) favouring weak connectivity and (D) favouring sparse connectivity. The vertical dashed line denotes when the ad-hoc technique of early stopping would be ideal, i.e. the moment the problem of over-fitting begins. Favouring weak connectivity and jittering exhibit similar effects on test error to those of applying early stopping.</p>
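The early-stopping behaviour this figure describes is easy to demonstrate in miniature (an illustrative gradient-descent sketch with arbitrary dimensions and seed, unrelated to the paper's GRN simulations): training proceeds for a fixed number of steps, the held-out error is monitored throughout, and the step at which it bottoms out is where early stopping would halt:

```python
import numpy as np

rng = np.random.default_rng(1)

# Overparameterised linear model trained by gradient descent; we monitor
# held-out error to locate where 'early stopping' would halt training.
n, d = 20, 50
X_train = rng.normal(size=(n, d))
w_true = rng.normal(size=d) * (rng.random(d) < 0.1)   # sparse regularity
y_train = X_train @ w_true + rng.normal(0, 0.1, n)    # noisy observations
X_test = rng.normal(size=(200, d))
y_test = X_test @ w_true                              # noise-free targets

w = np.zeros(d)
test_errs = []
for step in range(500):
    grad = X_train.T @ (X_train @ w - y_train) / n    # squared-error gradient
    w -= 0.01 * grad
    test_errs.append(np.mean((X_test @ w - y_test) ** 2))

best = int(np.argmin(test_errs))
print(f"early stop at step {best}; test error {test_errs[best]:.3f} "
      f"vs {test_errs[-1]:.3f} at the end")
```

By construction the error at the early-stopping step is no worse than the final error; when the model starts fitting the training noise, continuing past that step degrades generalisation, which is the over-fitting regime marked by the dashed line in the figure.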

    Predictions made by porting key lessons of learning theory to evolutionary theory.

    <p>Confirmed by experiment: † Conditions that facilitate generalised phenotypic distributions, ‡ How generalisation changes over evolutionary time, ◇ Conditions that facilitate generalised phenotypic distributions and ⋆ Sensitivity analysis to parameters affecting phenotypic generalisation.</p>

    Conditions that facilitate generalised phenotypic distributions.

    <p>Potential phenotypic distributions induced by the evolved developmental process under 1) different time-scales of environmental switching, 2) environmental noise (<i>κ</i> = 35 × 10<sup>−4</sup>) and 3) direct selection pressure for weak (<i>λ</i> = 38) and sparse connectivity (<i>λ</i> = 0.22). The organisms were exposed to three selective environments (a) from the general class (i). Developmental memorisation of past phenotypic targets clearly depends on the time-scale of environmental change. Noisy environments and parsimony pressures enhance the generalisation ability of development, predisposing it to produce previously unseen targets from the class. The size of the insect-like creatures indicates relative frequencies, i.e. the propensity of development to express the respective phenotype (phenotypes with frequency less than 0.01 were ignored). Note that the initial developmental structure represented all possible phenotypic patterns equally (here 2<sup>12</sup> possible phenotypes).</p>

    Role of the strength of parsimony pressure and the level of environmental noise.

    <p>The match between phenotypic distributions and the selective environments the network has been exposed to (training error) and all possible selective environments of the same class (generalisation error) for (A) noisy environments against parameter <i>κ</i>, and under parsimony pressure for weak (B) and sparse (C) connectivity against parameter <i>λ</i>.</p>